Convolutional Neural Network for Roadside Barriers Detection: Transfer Learning versus Non-Transfer Learning

Authors
Abstract


Similar resources

Learning Document Image Features With SqueezeNet Convolutional Neural Network

The classification of various document images is considered an important step towards building a modern digital library or office automation system. Convolutional Neural Network (CNN) classifiers trained with backpropagation are considered to be the current state-of-the-art models for this task. However, there are two major drawbacks to these classifiers: the huge computational power demand for...
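As a rough illustration of the kind of CNN classifier the abstract describes, the sketch below fine-tunes torchvision's pretrained SqueezeNet for document image classification; the class count, learning rate, and data handling are assumptions for illustration, not details from the paper.

import torch
import torch.nn as nn
from torchvision import models

NUM_DOC_CLASSES = 10  # hypothetical number of document categories

# Load SqueezeNet with ImageNet weights and swap in a classifier for the new labels.
model = models.squeezenet1_1(weights=models.SqueezeNet1_1_Weights.DEFAULT)
model.classifier[1] = nn.Conv2d(512, NUM_DOC_CLASSES, kernel_size=1)
model.num_classes = NUM_DOC_CLASSES

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)

def train_step(images, labels):
    # One backpropagation step on a batch of document images.
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()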


Pruning Convolutional Neural Networks for Resource Efficient Transfer Learning

We propose a new framework for pruning convolutional kernels in neural networks to enable efficient inference, focusing on transfer learning where large and potentially unwieldy pretrained networks are adapted to specialized tasks. We interleave greedy criteria-based pruning with fine-tuning by backpropagation—a computationally efficient procedure that maintains good generalization in the prune...
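The sketch below shows a simplified version of the interleaved prune-then-fine-tune loop. It substitutes L1-norm structured pruning from torch.nn.utils.prune for the paper's greedy criteria-based kernel selection; the backbone, pruning fraction, and schedule are assumptions for illustration.

import torch
import torch.nn as nn
import torch.nn.utils.prune as prune
from torchvision import models

model = models.vgg16(weights=models.VGG16_Weights.DEFAULT)  # large pretrained network
optimizer = torch.optim.SGD(model.parameters(), lr=1e-4, momentum=0.9)
criterion = nn.CrossEntropyLoss()

def prune_round(amount=0.1):
    # Zero out the lowest-L1-norm output channels (whole kernels) in every conv layer.
    for module in model.modules():
        if isinstance(module, nn.Conv2d):
            prune.ln_structured(module, name="weight", amount=amount, n=1, dim=0)

def fine_tune(loader, steps=100):
    # Recover accuracy with a short burst of backpropagation after each pruning round.
    for step, (images, labels) in enumerate(loader):
        if step >= steps:
            break
        optimizer.zero_grad()
        criterion(model(images), labels).backward()
        optimizer.step()

# Alternate pruning and fine-tuning until the desired sparsity is reached, e.g.:
# for _ in range(5):
#     prune_round(amount=0.1)
#     fine_tune(train_loader)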


Transfer Learning with Deep Convolutional Neural Network for SAR Target Classification with Limited Labeled Data

Tremendous progress has been made in object recognition with deep convolutional neural networks (CNNs), thanks to the availability of large-scale annotated datasets. With their ability to learn highly hierarchical image feature extractors, deep CNNs are also expected to solve Synthetic Aperture Radar (SAR) target classification problems. However, the limited labeled SAR target data becomes ...
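A minimal sketch of the general transfer-learning recipe the abstract points to: reuse pretrained convolutional features and train only a small classification head on the scarce labeled set. The ResNet-18 backbone and the class count are assumptions, not details from the paper, which targets SAR imagery.

import torch
import torch.nn as nn
from torchvision import models

NUM_TARGET_CLASSES = 10  # hypothetical number of SAR target classes

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False          # freeze the pretrained feature extractor
model.fc = nn.Linear(model.fc.in_features, NUM_TARGET_CLASSES)  # new trainable head

# Only the new head is optimized, so the limited labeled data
# has to fit only a small number of weights.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)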


CsMTL MLP For WEKA: Neural Network Learning with Inductive Transfer

We present context-sensitive Multiple Task Learning, or csMTL, as a method of inductive transfer embedded in the well-known WEKA machine learning suite. csMTL uses a single-output neural network and additional contextual inputs for learning multiple tasks. Inductive transfer occurs from secondary tasks to the model for the primary task so as to improve its predictive performance. The WEKA multi-...
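A rough sketch of the csMTL idea as described: one shared network with a single output, where a one-hot context vector identifying the task is appended to the primary inputs. The PyTorch framing, layer sizes, and activation are assumptions for illustration; the original is implemented as an MLP inside WEKA.

import torch
import torch.nn as nn

class CsMTLNet(nn.Module):
    def __init__(self, n_features, n_tasks, hidden=32):
        super().__init__()
        self.n_tasks = n_tasks
        self.net = nn.Sequential(
            nn.Linear(n_features + n_tasks, hidden),  # primary features + task context
            nn.Tanh(),
            nn.Linear(hidden, 1),                     # single shared output
        )

    def forward(self, x, task_id):
        # Encode which task each example belongs to as a one-hot context vector.
        context = nn.functional.one_hot(task_id, num_classes=self.n_tasks).float()
        return self.net(torch.cat([x, context], dim=1))

# Examples from secondary tasks update the same shared weights as the primary task,
# which is the mechanism of inductive transfer here.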


Transfer Learning for Neural Semantic Parsing

The goal of semantic parsing is to map natural language to a machine-interpretable meaning representation language (MRL). One of the constraints that limits full exploration of deep learning technologies for semantic parsing is the lack of sufficient annotated training data. In this paper, we propose using sequence-to-sequence models in a multi-task setup for semantic parsing with a focus on transfer...
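As a sketch of one common multi-task sequence-to-sequence arrangement, the model below shares an encoder across tasks and gives each task its own decoder, so a low-resource semantic-parsing task can benefit from the shared representation. The LSTM architecture, sharing scheme, and sizes are assumptions and not necessarily the paper's exact setup.

import torch
import torch.nn as nn

class MultiTaskSeq2Seq(nn.Module):
    def __init__(self, src_vocab, mrl_vocab, n_tasks, dim=256):
        super().__init__()
        self.embed = nn.Embedding(src_vocab, dim)
        self.encoder = nn.LSTM(dim, dim, batch_first=True)   # shared across tasks
        self.decoders = nn.ModuleList(
            nn.LSTM(dim, dim, batch_first=True) for _ in range(n_tasks)
        )
        self.out_embed = nn.Embedding(mrl_vocab, dim)
        self.proj = nn.Linear(dim, mrl_vocab)

    def forward(self, src_tokens, tgt_tokens, task_id):
        # Encode the natural-language input once with the shared encoder.
        _, state = self.encoder(self.embed(src_tokens))
        # Decode with the task-specific decoder, initialized from the encoder state.
        dec_out, _ = self.decoders[task_id](self.out_embed(tgt_tokens), state)
        return self.proj(dec_out)                             # logits over MRL tokens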



Journal

Journal title: Signals

Year: 2021

ISSN: 2624-6120

DOI: 10.3390/signals2010007